
    A cautionary note on robust covariance plug-in methods

    Many multivariate statistical methods rely heavily on the sample covariance matrix. It is well known, though, that the sample covariance matrix is highly non-robust. One popular approach for "robustifying" a multivariate method is to simply replace the covariance matrix with some robust scatter matrix. The aim of this paper is to point out that in some situations certain properties of the covariance matrix are needed for the corresponding robust "plug-in" method to be a valid approach, and that not all scatter matrices necessarily possess these important properties. In particular, the following three multivariate methods are discussed in this paper: independent components analysis, observational regression and graphical modeling. For each case, it is shown that using a symmetrized robust scatter matrix in place of the covariance matrix results in a proper robust multivariate method. Comment: 24 pages, 7 figures.
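
    The "symmetrized" scatter matrices mentioned above are commonly obtained by applying a scatter statistic to the pairwise differences of the observations, which are symmetric about the origin. The following is a minimal Python/NumPy sketch of that construction, not the paper's code; the choice of the spatial sign covariance as the base scatter and all function names are illustrative assumptions.

import numpy as np

def pairwise_differences(X):
    # All pairwise differences x_i - x_j (i < j) of the rows of X.
    i, j = np.triu_indices(X.shape[0], k=1)
    return X[i] - X[j]

def sign_covariance(Y):
    # Spatial sign covariance about the origin: mean of y y^T / ||y||^2.
    # Used here only as a stand-in for a robust scatter statistic of choice.
    norms = np.linalg.norm(Y, axis=1, keepdims=True)
    U = Y / np.where(norms == 0, 1.0, norms)
    return U.T @ U / len(U)

def symmetrized_scatter(X, scatter=sign_covariance):
    # Symmetrized version of a scatter statistic: the statistic applied to
    # the pairwise differences rather than to the centered data themselves.
    return scatter(pairwise_differences(X))

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3)) @ np.diag([3.0, 1.0, 0.5])
print(symmetrized_scatter(X))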

    On the eigenvalues of the spatial sign covariance matrix in more than two dimensions

    Acknowledgments: Alexander Dürre was supported in part by the Collaborative Research Grant 823 of the German Research Foundation. David E. Tyler was supported in part by the National Science Foundation grant DMS-1407751. A visit of Daniel Vogel to David E. Tyler was supported by a travel grant from the Scottish Universities Physics Alliance. The authors are grateful to the editors and referees for their constructive comments. (Non peer reviewed postprint.)
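
    For orientation, the spatial sign covariance matrix studied in this paper is the second-moment matrix of the data projected onto the unit sphere around a location (the spatial signs). A minimal NumPy sketch follows; the coordinate-wise median centering and the function name are illustrative assumptions rather than the authors' choices.

import numpy as np

def spatial_sign_covariance(X, center=None):
    # Spatial sign covariance matrix: mean of u u^T with
    # u = (x - center) / ||x - center||.  In practice the spatial median is
    # typically used as the center; the coordinate-wise median below is a
    # simple stand-in.
    if center is None:
        center = np.median(X, axis=0)
    Z = X - center
    norms = np.linalg.norm(Z, axis=1, keepdims=True)
    U = Z / np.where(norms == 0, 1.0, norms)
    return U.T @ U / len(U)

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 4)) * np.array([3.0, 2.0, 1.0, 0.5])
print(np.linalg.eigvalsh(spatial_sign_covariance(X)))  # the eigenvalues studied in the paper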

    Tools for Exploring Multivariate Data: The Package ICS

    Invariant coordinate selection (ICS) has recently been introduced as a method for exploring multivariate data. It includes as a special case a method for recovering the unmixing matrix in independent components analysis (ICA). It also serves as a basis for classes of multivariate nonparametric tests, and as a tool in cluster analysis or blind discrimination. The aim of this paper is to briefly explain the ICS method and to illustrate how various applications can be implemented using the R package ICS. Several examples show how the ICS method and the ICS package can be used in analyzing a multivariate data set.
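
    The following is a rough Python/NumPy sketch of the idea behind ICS, the simultaneous diagonalization of two scatter matrices, and not the interface of the R package ICS itself; the scatter pair covariance/cov4 (which corresponds to FOBI) and all names are illustrative assumptions.

import numpy as np

def cov4(X):
    # Scatter matrix of fourth moments: Mahalanobis-weighted covariance,
    # scaled so that it roughly equals the covariance matrix at the normal model.
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(Xc, rowvar=False))
    d2 = np.einsum('ij,jk,ik->i', Xc, S_inv, Xc)   # squared Mahalanobis distances
    return (Xc * d2[:, None]).T @ Xc / (n * (p + 2))

def ics(X, S1=lambda X: np.cov(X, rowvar=False), S2=cov4):
    # Find B with B S1 B' = I and B S2 B' diagonal (simultaneous
    # diagonalization of the two scatters) and return the generalized
    # eigenvalues and the invariant coordinates.
    S1_mat, S2_mat = S1(X), S2(X)
    vals, vecs = np.linalg.eigh(S1_mat)
    S1_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    kurt, V = np.linalg.eigh(S1_inv_sqrt @ S2_mat @ S1_inv_sqrt)
    B = V.T @ S1_inv_sqrt
    Z = (X - X.mean(axis=0)) @ B.T
    return kurt[::-1], Z[:, ::-1]                  # ordered by decreasing eigenvalue

rng = np.random.default_rng(2)
X = np.c_[rng.exponential(size=300), rng.standard_normal((300, 2))] @ rng.standard_normal((3, 3))
kurt, Z = ics(X)
print(kurt)   # generalized kurtosis values of the invariant coordinates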

    Asymptotic and bootstrap tests for the dimension of the non-Gaussian subspace

    Dimension reduction is often a preliminary step in the analysis of large data sets. The so-called non-Gaussian component analysis searches for a projection onto the non-Gaussian part of the data, and it is then important to know the correct dimension of the non-Gaussian signal subspace. In this paper we develop asymptotic as well as bootstrap tests for this dimension based on the popular fourth order blind identification (FOBI) method.
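
    Roughly speaking, on whitened data the FOBI matrix E[||z||^2 z z'] has eigenvalue p + 2 for every Gaussian coordinate, so the deviation of its eigenvalues from p + 2 is what such dimension tests exploit. The sketch below, assuming NumPy, only computes these eigenvalues; the paper's actual asymptotic and bootstrap test statistics and their null distributions are not reproduced here, and the function name is an illustrative assumption.

import numpy as np

def fobi_eigenvalues(X):
    # Eigenvalues of the FOBI matrix (1/n) * sum ||z_i||^2 z_i z_i^T of the
    # whitened data z.  Coordinates spanning a Gaussian subspace give
    # eigenvalues close to p + 2.
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    Z = Xc @ vecs @ np.diag(vals ** -0.5) @ vecs.T          # whitened data
    B = (Z * (Z ** 2).sum(axis=1)[:, None]).T @ Z / n
    return np.linalg.eigvalsh(B)

rng = np.random.default_rng(3)
# two non-Gaussian signal coordinates plus three Gaussian noise coordinates
X = np.c_[rng.exponential(size=(1000, 2)), rng.standard_normal((1000, 3))]
print(fobi_eigenvalues(X) - (X.shape[1] + 2))   # near zero for the Gaussian part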

    On the maximum bias functions of MM-estimates and constrained M-estimates of regression

    We derive the maximum bias functions of the MM-estimates and the constrained M-estimates, or CM-estimates, of regression and compare them to the maximum bias functions of the S-estimates and the τ-estimates of regression. In these comparisons, the CM-estimates tend to exhibit the most favorable bias-robustness properties. Also, under the Gaussian model, it is shown how one can construct a CM-estimate which has a smaller maximum bias function than a given S-estimate, that is, the resulting CM-estimate dominates the S-estimate in terms of maxbias and, at the same time, is considerably more efficient. Comment: Published at http://dx.doi.org/10.1214/009053606000000975 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org).
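
    For orientation, the maximum bias function of a regression estimate T over an ε-contamination neighborhood of a central model F_0 is usually defined along the following lines (a standard formulation, not necessarily the paper's exact one):

        B_T(\varepsilon) \;=\; \sup\bigl\{\, b\bigl(T(F)\bigr) \;:\; F = (1-\varepsilon)F_0 + \varepsilon H,\ H \text{ an arbitrary distribution} \,\bigr\},

    where b measures the distance of T(F) from the regression parameter at F_0 in an affine-equivariant metric, and the comparisons above refer to these functions of ε.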